119 research outputs found

    A Time-driven Data Placement Strategy for a Scientific Workflow Combining Edge Computing and Cloud Computing

    Compared to traditional distributed computing environments such as grids, cloud computing provides a more cost-effective way to deploy scientific workflows. Each task of a scientific workflow requires several large datasets that are located in different datacenters of the cloud computing environment, resulting in serious data transmission delays. Edge computing reduces these delays and allows a workflow's private datasets to be stored at fixed locations, but its storage capacity is a bottleneck. It is therefore a challenge to combine the advantages of edge computing and cloud computing to place the data of a scientific workflow rationally and to optimize the data transmission time across datacenters. Traditional data placement strategies maintain load balancing across a given number of datacenters, which results in a large data transmission time. In this study, a self-adaptive discrete particle swarm optimization algorithm with genetic algorithm operators (GA-DPSO) was proposed to optimize the data transmission time when placing data for a scientific workflow. This approach considered the characteristics of data placement in a combined edge-cloud environment, as well as the factors affecting transmission delay, such as the bandwidth between datacenters, the number of edge datacenters, and the storage capacity of edge datacenters. The crossover and mutation operators of the genetic algorithm were adopted to avoid the premature convergence of the traditional particle swarm optimization algorithm, which enhanced the diversity of the evolving population and effectively reduced the data transmission time. The experimental results show that the data placement strategy based on GA-DPSO can effectively reduce the data transmission time during workflow execution in a combined edge-cloud environment.
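    A minimal sketch of the general idea described above, assuming a simplified fitness model: particles encode dataset-to-datacenter assignments, and GA crossover/mutation operators replace the usual velocity update of discrete PSO. The bandwidths, capacities, task list, and the transmission_time function are illustrative assumptions, not the paper's exact GA-DPSO formulation.

```python
# Illustrative sketch of a discrete PSO with GA operators for data placement.
# All constants and the fitness model below are assumptions for demonstration.
import random

N_DATASETS = 8            # datasets used by the workflow (assumed)
DATACENTERS = [0, 1, 2]   # 0 = cloud DC, 1-2 = edge DCs (assumed layout)
CAPACITY = {0: float("inf"), 1: 4, 2: 4}         # edge storage limits (assumed)
BANDWIDTH = {(0, 1): 10, (0, 2): 8, (1, 2): 20}  # MB/s between DCs (assumed)
DATASET_MB = [random.randint(50, 200) for _ in range(N_DATASETS)]
# Each task is the pair of datasets it needs (illustrative only).
TASKS = [(0, 1), (2, 3), (4, 5), (6, 7), (1, 4), (3, 6)]

def transmission_time(placement):
    """Total time to co-locate every task's datasets (simplified fitness)."""
    total = 0.0
    for a, b in TASKS:
        da, db = placement[a], placement[b]
        if da != db:  # move the smaller dataset across the slower link
            bw = BANDWIDTH[tuple(sorted((da, db)))]
            total += min(DATASET_MB[a], DATASET_MB[b]) / bw
    # Penalize placements that overflow an edge datacenter's capacity.
    for dc, cap in CAPACITY.items():
        if sum(1 for p in placement if p == dc) > cap:
            total += 1e6
    return total

def crossover(p1, p2):
    cut = random.randrange(1, N_DATASETS)
    return p1[:cut] + p2[cut:]

def mutate(p, rate=0.1):
    return [random.choice(DATACENTERS) if random.random() < rate else g for g in p]

def ga_dpso(pop_size=30, iters=200):
    swarm = [[random.choice(DATACENTERS) for _ in range(N_DATASETS)]
             for _ in range(pop_size)]
    pbest = list(swarm)
    gbest = min(swarm, key=transmission_time)
    for _ in range(iters):
        for i, particle in enumerate(swarm):
            # GA operators stand in for the velocity update: pull toward the
            # personal best, then the global best, then explore by mutation.
            child = mutate(crossover(crossover(particle, pbest[i]), gbest))
            swarm[i] = child
            if transmission_time(child) < transmission_time(pbest[i]):
                pbest[i] = child
        gbest = min(pbest, key=transmission_time)
    return gbest, transmission_time(gbest)

if __name__ == "__main__":
    best, t = ga_dpso()
    print("placement:", best, "transmission time:", round(t, 2))
```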

    Stochastic Optimization: Theory and Applications

    As an important branch of applied mathematics, optimization theory, and stochastic optimization in particular, has recently become an important tool for solving multiobjective decision-making problems in random processes. Many kinds of industrial, biological, engineering, and economic problems can be viewed as stochastic systems, for example in communication, genetics, signal processing, geography, civil engineering, aerospace, and banking. Stochastic optimization is well suited to solving the decision-making problems that arise in these stochastic systems.

    Foreword and editorial: International journal of security and its applications


    The Study on Stage Financing Model of IT Project Investment

    Stage financing is a basic operation of venture capital investment, and venture capitalists usually use different strategies to obtain the maximum returns. Because of its advantages in reducing information asymmetry and agency cost, stage financing is widely used by venture capitalists. Although considerable attention has been devoted to stage financing, very little is known about risk aversion strategies for IT projects. This paper mainly addresses the problem of risk aversion in venture capital investment in IT projects. Based on an analysis of the characteristics of venture capital investment in IT projects, the paper introduces a real option pricing model to measure the value brought by the stage financing strategy and designs a risk aversion model for IT projects. Because the real option pricing method regards investment activity as a contingent decision, it helps in judging the management flexibility of IT projects and thus in evaluating IT programs more reasonably. Lastly, the model is applied to a real case, which further illustrates its effectiveness and feasibility.
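    A minimal illustration of the real-option idea, assuming a standard Black-Scholes European call as a stand-in for the paper's pricing model: a later financing stage is treated as an option to invest a fixed amount in a project whose value is uncertain. The function name, parameters, and all figures are hypothetical and not taken from the paper.

```python
# Illustrative only: value the flexibility of a staged investment as a call option.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def staged_option_value(V, I, T, r, sigma):
    """Black-Scholes value of the option to invest I at time T in a project worth V.

    V     -- present value of the project's expected cash flows
    I     -- cost of the next financing stage (acts as the strike price)
    T     -- years until the staging decision
    r     -- risk-free rate
    sigma -- volatility of the project's value
    """
    d1 = (log(V / I) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return V * norm_cdf(d1) - I * exp(-r * T) * norm_cdf(d2)

# Example (made-up numbers): a project worth 2.0 million with a 1.8 million
# second-round cost to be decided in one year.
print(round(staged_option_value(V=2.0, I=1.8, T=1.0, r=0.03, sigma=0.45), 3))
```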

    Serum neurofilament dynamics predicts neurodegeneration and clinical progression in presymptomatic Alzheimer's disease

    Neurofilament light chain (NfL) is a promising fluid biomarker of disease progression for various cerebral proteopathies. Here we leverage the unique characteristics of the Dominantly Inherited Alzheimer Network and ultrasensitive immunoassay technology to demonstrate that NfL levels in the cerebrospinal fluid (n = 187) and serum (n = 405) are correlated with one another and are elevated at the presymptomatic stages of familial Alzheimer's disease. Longitudinal, within-person analysis of serum NfL dynamics (n = 196) confirmed this elevation and further revealed that the rate of change of serum NfL could discriminate mutation carriers from non-mutation carriers almost a decade earlier than cross-sectional absolute NfL levels (that is, 16.2 versus 6.8 years before the estimated symptom onset). The serum NfL rate of change peaked in participants converting from the presymptomatic to the symptomatic stage and was associated with cortical thinning assessed by magnetic resonance imaging, but less so with amyloid-β deposition or glucose metabolism (assessed by positron emission tomography). Serum NfL was predictive of both the rate of cortical thinning and cognitive changes assessed by the Mini-Mental State Examination and Logical Memory test. Thus, NfL dynamics in serum predict disease progression and brain neurodegeneration at the early presymptomatic stages of familial Alzheimer's disease, supporting its potential as a clinically useful biomarker.
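    A small sketch of the kind of within-person summary the abstract refers to, assuming each participant's rate of change is estimated as a per-person least-squares slope of serum NfL against time. This is not the study's analysis code, and the measurements below are invented for illustration.

```python
# Illustrative per-participant NfL rate-of-change estimate (ordinary least squares).
def slope(years, nfl):
    """Least-squares slope of NfL (pg/mL) against time (years) for one person."""
    n = len(years)
    mean_t = sum(years) / n
    mean_y = sum(nfl) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(years, nfl))
    den = sum((t - mean_t) ** 2 for t in years)
    return num / den

# Hypothetical serial measurements for one mutation carrier and one non-carrier.
carrier = slope([0, 1.1, 2.0, 3.2], [11.5, 13.0, 15.1, 17.8])
control = slope([0, 1.0, 2.1, 3.0], [10.2, 10.5, 10.9, 11.0])
print(f"carrier slope: {carrier:.2f} pg/mL per year, control: {control:.2f}")
```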